Course Portfolio
This portfolio showcases work from PSYCH 403A1: Neuroimaging & Neurostimulation. Browse the assignments below to see the complete coursework.
📊 Assignment 1: EEG Analysis
📓 Jia Guina - assignment1_eeg_filtering - Jia Guina.ipynb
Jupyter Notebook. 💡 Opens in Google Colab for interactive execution (requires a Google account).
📝 Neon Ribbon Waves - Alpha Response Visualization
Description: This visualization creates expanding neon ribbon waves that respond to alpha wave activity. Features glowing, multi-layered circular ribbons with wavy edges that pulse and rotate, creating a mesmerizing kaleidoscope effect with shifting neon colors.
🎨 Assignment 2: BrainImation
📄 PSYCH403 - Assignment 2 Writeup & Code - Google Docs - Jia Guina.pdf
View Original PDF
WRITEUP: The concept of my brain art is inspired by “Resonance”, an immersive audio-visual experience that reacts to differences in how our brains respond to environmental stimuli. I searched for “brain art that detects and responds to certain brain states”, since I thought that concept would be a better fit for expanding on for the midterm. I noticed that “Resonance” looked very similar to alpha waves, so I started with that. I knew I wanted to change the shape of the waves from perfect circles to something that looked more like actual waves. I wanted to go for a modern interpretation of brainwave activity: something that doesn’t just show data but feels like what’s happening in the brain, as weird as that sounds. The glowing waves and wavy circular forms represent both the rhythm and complexity of neural patterns while keeping a visually captivating and contemporary style. The subtle rotation and the ripple motion in each wave are meant to make the visual feel like it’s breathing, emphasizing the connection between human brain activity and living movement.
The original alpha wave code from the example had a glowing “effect” of sorts, but it only looked that way because the waves would ripple and clump together, creating an illusion of glow. I wanted to create a more intentional, separate glow effect around the alpha waves. The new code works a bit differently. Instead of using colorMode(HSB), which changes how p5.js interprets colour values, I switched to using HSL colour strings like in CSS (for example, color("hsl(200, 90%, 70%)")). This made it easier to choose specific hues and brightness levels, allowing me to create more vibrant neon tones, like glowing blues and purples.
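A minimal sketch of the colour idea described above, in plain JavaScript so the string-building stands on its own. In p5.js the returned string would go straight into color(); the function name neonHSL and the hue wrap-around are my own assumptions, not part of the original sketch.

```javascript
// Build a per-wave CSS-style HSL colour string, as described in the writeup.
// `colorOffset` gives each ring its own hue shift; the % 360 wrap is assumed.
function neonHSL(baseHue, colorOffset, lightness = 70) {
  const hue = (baseHue + colorOffset) % 360; // shift each ring's hue
  return `hsl(${hue}, 90%, ${lightness}%)`;  // vibrant neon tone at 90% saturation
}
```

In a p5.js draw loop this could be used as, for example, `stroke(color(neonHSL(200, w.colorOffset)))` to give each ring a slightly different glowing blue or purple.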
Each wave now stores more information: not just its position, size, and transparency, but also two new properties. phase controls the wavy motion so each ring ripples differently, and colorOffset gives each ring its own hue shift, making the colours shimmer instead of all looking the same. The drawWavyCircle() function loops around a circle and slightly wiggles its edge using a sine wave (sin(a * 8 + phase)), creating rippled, wavy shapes that look more like energy waves or liquid rings. On top of that, the new drawNeonRibbon() function draws each circle multiple times in layers, with each layer thicker and more transparent, creating a soft glow with a bright “core line” in the center. This layering gives the illusion of neon light or bioluminescence. The animation also behaves differently now: each wave expands faster (w.r += w.alpha * 12 + 2) and rotates slightly over time (w.phase += 0.03), making the ripples move fluidly instead of just growing outward, again for that “breathing”, relaxing effect.
One challenge I faced was that it was hard to make changes to the code that changed the visual “enough” for it to actually look cool. At first I would start with an example on the BrainImation platform and just change the numbers of things (radius, number of shapes, sizes, or hues). But that didn’t look cool enough to me, and the visual wouldn’t obviously relate to brain waves. It was also challenging to transfer the code from p5.js to BrainImation, because p5.js doesn’t recognize the EEG data sets, so an error appears in BrainImation even though the visual runs fine in p5.js, where it’s given a “random” data set. What I learned was that coding is quite hard. For example, I had an idea of what I wanted to change in the code, but I had to look up what that would look like. Even then, understanding things like calling functions and the indentation of the code is super important in making sure it all runs smoothly. I also noticed the code in BrainImation had “//”, which seems to act as a comment marker, although I know that in Python “#” is the symbol for comments. When I tried to add a comment using “#” in BrainImation, the text turned red and signaled an error. So overall, I learned how small coding details can completely change how a visual turns out, and that experimentation and problem-solving are a huge part of the creative process.
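The wavy-edge math and the animation step described above can be sketched in plain JavaScript (no p5.js drawing calls, so the numbers stand alone). The function names wavyRadius and stepWave, and the wobble amount, are illustrative assumptions; the expressions sin(a * 8 + phase), w.r += w.alpha * 12 + 2, and w.phase += 0.03 come from the writeup.

```javascript
// Radius of a wavy ring at angle `angle`: the sine term from the writeup
// (sin(a * 8 + phase)) puts 8 ripples around the ring, and `wobble` (an
// assumed amount) sets how far the edge deviates from a perfect circle.
function wavyRadius(baseRadius, angle, phase, wobble = 6) {
  return baseRadius + Math.sin(angle * 8 + phase) * wobble;
}

// One animation step for a wave object shaped like the writeup's `w`:
// the ring expands faster while it is still opaque (alpha near 1),
// and its phase advances so the ripples appear to rotate.
function stepWave(w) {
  return {
    r: w.r + w.alpha * 12 + 2,
    phase: w.phase + 0.03,
    alpha: w.alpha,
    colorOffset: w.colorOffset,
  };
}
```

In the actual sketch, drawWavyCircle() would call something like wavyRadius() once per vertex around the circle, and drawNeonRibbon() would redraw the same ring several times with growing stroke weight and shrinking opacity to build the glow.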
🎯 Midterm Project
📄 guina_midterm_part1&2_writeup - Jia Guina.pdf
View Original PDF
Name: Jia Guina
Student ID: 1753064
MIDTERM PART 1 WRITEUP:
For the first part of the midterm, I expanded on the concept of “Resonance” from my
assignment 2. The novel BCI concept I implemented was option 3, Multi-State Classification,
so the system detects and responds to distinct mental states rather than just visualizing a
single type of brainwave. Specifically, it differentiates between eyes open vs. eyes closed,
and relaxed vs. focused vs. stressed states, using EEG frequency band analysis. Each mental
state triggers a unique visual response, allowing the brain art to dynamically reflect the
user’s brain activity in real time.
The visualization loop works by continuously reading EEG data, classifying the current
mental state based on alpha, beta, and theta activity, and mapping these states to visual
parameters such as colour, wave speed, ripple amplitude, and glow intensity. For example,
alpha spikes during eyes-closed relaxation produce slow, smooth waves with cool neon
tones, while focused or stressed states create faster, sharper, and warmer-coloured ripples.
Each wave also incorporates phase offsets and layered drawing to create wavy, glowing
ribbons, giving the effect of energy ripples or bioluminescent rings that expand, rotate, and
shimmer in a “breathing” motion.
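The classify-then-map loop described above can be sketched in plain JavaScript. The thresholds, function names, and visual-parameter values here are illustrative assumptions, not the actual values from the midterm code; only the idea (band powers in, state out, state mapped to colour/speed/glow) comes from the writeup.

```javascript
// Hypothetical classification step: pick a mental state from the relative
// power of the alpha, beta, and theta bands (simple dominant-band rule).
function classifyState(alpha, beta, theta) {
  if (alpha > beta && alpha > theta) return "relaxed"; // eyes-closed alpha spike
  if (beta > alpha && beta > theta) return "focused";  // beta-dominant activity
  return "stressed";                                   // everything else
}

// Map each state to visual parameters; the numbers are placeholders.
// Relaxed -> slow waves in cool neon tones; focused/stressed -> faster,
// sharper, warmer-coloured ripples, as described in the writeup.
const stateVisuals = {
  relaxed:  { hue: 210, waveSpeed: 0.5, glow: 1.0 },
  focused:  { hue: 30,  waveSpeed: 1.5, glow: 0.7 },
  stressed: { hue: 0,   waveSpeed: 2.5, glow: 0.5 },
};
```

Each frame, the sketch would read the current band powers, call classifyState(), and feed the matching stateVisuals entry into the wave-drawing code.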
The new code expands on the original alpha-wave visualization by detecting multiple mental
states (eyes open/closed, relaxed, focused, and stressed) and changing the waves and
background accordingly. Each state has its own colour palette, wave speed, glow intensity,
and hue shift, making the visuals more dynamic and expressive. The waves now shimmer,
expand, and rotate differently depending on the brain state, while the background gradient
changes to match the mood. Compared to the original, which used the same colour and
motion throughout, the new version is more immersive and responsive.
From implementing this BCI, I learned that coding is still hard, especially for real-time brain
visualization; it requires careful planning and attention to every small detail. I also found
it hard to get the same colour on the brain waves as in the original multi-channel trace. Even
when copying it into my new code, it looked different every time. I tried different numbers
to achieve better colours, but that would just give me another shade of the same thing.
Still, I gained experience in translating abstract neural data into meaningful, aesthetic
visuals, and in understanding how layered drawing and colour manipulation affect perception.
Overall, the project reinforced that creativity and technical problem-solving are deeply
intertwined when building responsive, interactive brain art.
MIDTERM PART 2 WRITEUP:
In my ERP experiment, I’m measuring the N170 component, using a paradigm where I show
participants alternating faces and objects. I wanted to go with this measure because I have
always been amazed by and curious about the fusiform face area (FFA) in the brain. The faces
are simple schematic drawings with eyes, a mouth, and an oval head, while the objects are
coloured rectangles. The stimuli switch every ~30 frames, and I count each presentation as
a trial. By manipulating stimulus category (face vs. object), I can investigate
category-specific neural responses.
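The trial loop described above can be sketched as a single function: the stimulus category flips every ~30 frames, and each presentation counts as one trial. The function name stimulusAt and the face-first ordering are assumptions for illustration.

```javascript
// Which stimulus is on screen at a given frame, if the category flips
// every `switchEvery` frames (~30 in the writeup). Each flip is one trial.
function stimulusAt(frame, switchEvery = 30) {
  const trial = Math.floor(frame / switchEvery); // trial index so far
  return trial % 2 === 0 ? "face" : "object";    // assumed face-first order
}
```

Logging the frame at each flip also gives the stimulus-onset times needed later for epoching.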
At the same time, the EEG records from four electrodes (TP9, AF7, AF8, TP10). In my code,
I display a real-time trace of the most recent 200 data points for each electrode, using
different colours for clarity. This gives me an immediate look at the brain activity, but the
main component I’m interested in is the N170. To measure it properly, I would segment the
EEG into epochs time-locked to stimulus onset and average across trials to create an
event-related potential (ERP). Averaging helps reduce background noise and highlights the
stimulus-locked response.
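The epoch-and-average step described above can be sketched in plain JavaScript, one electrode at a time. The function name averageERP and its exact signature are my own assumptions; the logic (segment time-locked to stimulus onset, then average across trials) is as described.

```javascript
// Segment continuous EEG into epochs time-locked to stimulus onsets and
// average across trials. `data` is one electrode's samples, `onsets` are
// sample indices of stimulus onset, `epochLen` is samples per epoch.
function averageERP(data, onsets, epochLen) {
  const erp = new Array(epochLen).fill(0);
  let n = 0;
  for (const onset of onsets) {
    if (onset + epochLen > data.length) continue; // skip incomplete epochs
    for (let t = 0; t < epochLen; t++) erp[t] += data[onset + t];
    n++;
  }
  // Averaging cancels background noise that is not time-locked to the
  // stimulus, leaving the stimulus-locked response (e.g. the N170 ~170 ms
  // after onset) visible in the averaged trace.
  return erp.map((v) => v / n);
}
```

Running this separately for face-onset and object-onset trials on the posterior electrodes (TP9/TP10) would give the two ERPs whose difference should show the face-selective N170.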
The N170 is a negative deflection that occurs around 170 ms after seeing a stimulus, mainly
over posterior temporal electrodes. It’s considered a marker of face-selective perceptual
processing and structural encoding of faces. Faces usually produce larger N170 amplitudes
than other objects because the brain is processing the configuration of facial features. If a
face is inverted, the N170 is delayed and larger, reflecting disrupted face processing.
By comparing EEG responses to faces versus objects, I can measure the N170 in my
paradigm. The real-time EEG traces help me monitor signal quality during the experiment,
while averaging across trials offline will reveal the N170 component more clearly. If this
were tested with real EEG, I would expect larger N170 amplitudes for faces, smaller
amplitudes for non-face objects, and the strongest responses over temporal-posterior
electrodes. This brings me back to the FFA: since it is located in the temporal lobe, it
would make sense for the N170 to be larger in the presence of faces.
Overall, this setup lets me study face-selective neural processing using the N170, combining
live EEG visualization with a controlled stimulus manipulation that supports ERP extraction
and averaging, consistent with what the literature reports.
Midterm Visualizations
📝 guina_midterm_part2 - Jia Guina.txt
💡 Code is embedded in this portfolio - opens instantly in the live BrainImation editor (no internet required!)
📝 guina_midterm_part1 - Jia Guina.txt
💡 Code is embedded in this portfolio - opens instantly in the live BrainImation editor (no internet required!)